48 research outputs found

    An automatic correction of Ma's thinning algorithm based on P-simple points

    Get PDF
    The notion of P-simple points was introduced by Bertrand to design parallel thinning algorithms. In 'A 3D fully parallel thinning algorithm for generating medial faces', Ma proposed an algorithm for which there exist objects whose topology is not preserved. In this paper, we propose a new application of P-simple points: to automatically correct Ma's algorithm.

    Automatic correction of Ma and Sonka's thinning algorithm using P-simple points

    Get PDF
    Ma and Sonka proposed a fully parallel 3D thinning algorithm which does not always preserve topology. We propose an algorithm based on P-simple points which automatically corrects Ma and Sonka's algorithm. As far as we know, our algorithm is the only fully parallel curve thinning algorithm which preserves topology.

    Interventional planning and assistance for ascending aorta dissections

    Get PDF
    In this paper, we present our global image processing framework for interventional planning and assistance for ascending aorta dissections. The preoperative stage of the framework performs the extraction of aortic dissection features in Computed Tomography Angiography (CTA) images; it mainly consists of a customized fast marching segmentation. The intraoperative stage performs medical image registration and enhances data visualization; standard X-ray fluoroscopic images are used as the reference modality. We use our recently introduced registration method based on image transformation descriptors (ITDs) and usual 3D/2D techniques (based on digitally reconstructed radiographs). The first stage provides aortic dissection features and is intended to help clinicians with planning. The second stage provides an augmented reality visualization and could be used for assistance during the intervention. As far as we know, this is the first complete image processing framework that focuses on the (minimally invasive) endovascular treatment of ascending aorta dissections.
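    The fast marching segmentation mentioned above propagates a front from a seed point at a speed derived from the image. As an illustration only (not the authors' implementation), a Dijkstra-style approximation of this front propagation on a regular grid can be sketched as follows; the speed map and seed are hypothetical inputs:

    ```python
    import heapq

    def fast_marching(speed, seed):
        """Dijkstra-style approximation of fast marching front propagation:
        computes an arrival-time map T roughly solving |grad T| = 1/speed,
        starting from a seed pixel. A segmentation can then be obtained by
        thresholding T."""
        rows, cols = len(speed), len(speed[0])
        INF = float("inf")
        T = [[INF] * cols for _ in range(rows)]
        T[seed[0]][seed[1]] = 0.0
        heap = [(0.0, seed)]
        while heap:
            t, (r, c) = heapq.heappop(heap)
            if t > T[r][c]:
                continue  # stale heap entry, already improved
            for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    nt = t + 1.0 / speed[nr][nc]  # slower speed = later arrival
                    if nt < T[nr][nc]:
                        T[nr][nc] = nt
                        heapq.heappush(heap, (nt, (nr, nc)))
        return T
    ```

    A true fast marching method solves the eikonal equation with a first-order upwind scheme; this 4-neighbor Dijkstra variant only approximates it, but conveys the idea of front propagation ordered by arrival time.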

    Presentation of several 3D thinning schemes based on Bertrand's P-simple points and critical kernels notions

    Get PDF
    Within the framework of Digital Topology, Gilles Bertrand first proposed the notion of topological number. By computing two of these numbers, we can efficiently and locally verify whether a point is simple or not, i.e., whether its deletion from an object preserves both the number of connected components and the number of holes in the image (global concepts). In this case, we say that the topology of the image is well preserved. Several algorithms based on the removal of simple points have been proposed in order to simplify objects in images; they are called thinning or skeletonization algorithms, and their results are called skeletons. An algorithm based on the sequential deletion of simple points automatically preserves the topology. If it operates by the simultaneous deletion of simple points, a thinning algorithm must then implement additional deletion strategies in order to preserve the topology (for example, never remove two adjacent simple points at the same time, as in the subfields strategy). Gilles Bertrand then proposed the concept of P-simple point, which requires the definition of a set P inside an object. Once the points of P are labeled in the image, we may locally characterize (by using topological numbers) whether a point is P-simple or not. This is a major contribution to the field of Digital Topology in several ways:
    - an algorithm removing only P-simple points automatically preserves the topology; no additional proof is required, unlike for most algorithms proposed before the introduction of P-simple points;
    - if the points deleted by an existing algorithm are P-simple, with P being precisely the set of points deleted by that algorithm, then the algorithm preserves the topology well; otherwise, a thorough examination is required.
    Bertrand then introduced critical kernels as a general framework to design thinning schemes in the category of abstract complexes. In fact, critical kernels constitute a generalization of P-simple points. In this talk, several thinning schemes, based on either P-simple points or critical kernels, will be recalled.
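    To illustrate the notions recalled above, here is a minimal sketch, not taken from the talk, of the local simple-point test in 2D (assuming the standard (8,4)-adjacency pair: 8-adjacency for the object, 4-adjacency for the background), together with a sequential thinning loop that deletes simple points one at a time and therefore preserves topology:

    ```python
    # A point p is (8,4)-simple iff both topological numbers computed in
    # its 3x3 neighborhood equal 1: one object 8-component in the ring, and
    # one background 4-component touching a 4-neighbor of p.
    N8 = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    N4 = [(-1, 0), (0, 1), (1, 0), (0, -1)]

    def _components(cells, adjacency):
        """Connected components of a set of pixels under the given adjacency."""
        cells, comps = set(cells), []
        while cells:
            comp, stack = set(), [cells.pop()]
            while stack:
                r, c = stack.pop()
                comp.add((r, c))
                for dr, dc in adjacency:
                    q = (r + dr, c + dc)
                    if q in cells:
                        cells.remove(q)
                        stack.append(q)
            comps.append(comp)
        return comps

    def is_simple(obj, p):
        """True iff p is an (8,4)-simple point of `obj` (a set of pixels)."""
        r, c = p
        ring = [(r + dr, c + dc) for dr, dc in N8]
        fg = [q for q in ring if q in obj]       # object points around p
        bg = [q for q in ring if q not in obj]   # background points around p
        t8 = len(_components(fg, N8))            # first topological number
        four = {(r + dr, c + dc) for dr, dc in N4}
        # second topological number: bg 4-components touching a 4-neighbor of p
        t4bar = sum(1 for comp in _components(bg, N4) if comp & four)
        return t8 == 1 and t4bar == 1

    def sequential_thinning(obj):
        """Delete simple points one at a time: topology is preserved."""
        obj = set(obj)
        changed = True
        while changed:
            changed = False
            for p in sorted(obj):
                if is_simple(obj, p):
                    obj.remove(p)
                    changed = True
        return obj
    ```

    Without an end-point condition, this loop reduces any hole-free connected object to a single point; curve or surface skeletons are obtained by additionally protecting end points, and parallel variants need the deletion strategies (subfields, P-simple points, critical kernels) discussed above.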

    Contribution to the topological analysis of images: a study of skeletonization algorithms for 2D and 3D images, following a digital topology or discrete topology approach.

    No full text
    This thesis proposes new thinning algorithms for 2D and 3D images according to two approaches, using either digital topology or discrete topology. In the first part, we recall some fundamental notions of digital topology and several of the best-known thinning algorithms, which delete simple points. Then, we propose a methodology to produce new thinning algorithms based on the parallel deletion of P-simple points. Such algorithms are designed so that they delete at least the points removed by another given existing thinning algorithm. By applying this methodology, we produce two new algorithms. Although the results seem satisfying, the design and implementation of such algorithms remain difficult. In the second part, we use the mathematical framework of partially ordered sets (or posets). We propose, more directly than before, a thinning algorithm consisting in the repetition of the parallel deletion of alpha_n-simple points, followed by the parallel deletion of beta_n-simple points. We also propose new definitions of end points, which permit us to obtain either curve skeletons or surface skeletons. The thinning scheme is applied to 2D and 3D binary images and to 2D grayscale images. Finally, a study of the parallel filtering of skeletons is developed.

    Poset approach to 3D parallel thinning

    No full text
    One of the authors has proposed a study of homotopy and simplicity in partially ordered sets (or posets) [1, 2]. The notion of unipolar point was introduced: a unipolar point can be seen as an "inessential" element for the topology. Thus, the iterative deletion of unipolar points constitutes a first thinning algorithm. We show in this paper that such an algorithm does not "thin enough" certain images. This is the reason why we use the notion of α-simple point, introduced in the framework of posets in Ref. 1. The definition of such a point is recursive. As we can locally decide whether a point is α-simple, we can use classical techniques (such as a binary decision diagram) to characterize such points more quickly. Furthermore, it is possible to remove α-simple points in parallel in a poset while preserving the topology of the image. We then discuss the characterization of end points in order to produce various skeletons. In particular, we propose a new approach to characterize surface end points, which permits us to keep certain junctions of surfaces. Finally, we propose very simple parallel algorithms based on the deletion of α-simple points, consisting in the repetition of two subiterations.
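    The abstract notes that a locally decidable test can be compiled once, e.g., into a binary decision diagram, and then evaluated very quickly. A simpler but equivalent idea, shown here as a hypothetical sketch (using the 2D simple-point test rather than the paper's poset framework), is to tabulate the decision for all 256 configurations of the 8 neighbors and replace the computation by a single table lookup:

    ```python
    # Precompute, for every 8-bit neighborhood configuration, whether the
    # central 2D point is (8,4)-simple; bit i of the mask encodes OFFS[i].
    OFFS = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]

    def _comps(points, adjacent):
        """Connected components of a small point set, given an adjacency test."""
        points, comps = set(points), []
        while points:
            comp, stack = set(), [points.pop()]
            while stack:
                p = stack.pop()
                comp.add(p)
                for q in list(points):
                    if adjacent(p, q):
                        points.remove(q)
                        stack.append(q)
            comps.append(comp)
        return comps

    def simple_from_mask(mask):
        """Decide (8,4)-simpleness from an 8-bit neighborhood configuration."""
        fg = [OFFS[i] for i in range(8) if mask >> i & 1]
        bg = [OFFS[i] for i in range(8) if not (mask >> i & 1)]
        adj8 = lambda p, q: max(abs(p[0] - q[0]), abs(p[1] - q[1])) == 1
        adj4 = lambda p, q: abs(p[0] - q[0]) + abs(p[1] - q[1]) == 1
        n4 = {(-1, 0), (0, 1), (1, 0), (0, -1)}
        t8 = len(_comps(fg, adj8))
        # background 4-components that contain a 4-neighbor of the center
        t4bar = sum(1 for c in _comps(bg, adj4) if c & n4)
        return t8 == 1 and t4bar == 1

    # Built once; afterwards each test is a single indexed access.
    LUT = [simple_from_mask(m) for m in range(256)]
    ```

    A BDD compresses such a table further and generalizes to the larger 3D neighborhoods (2^26 configurations), where a flat table becomes costly.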

    Contribution to the topological analysis of images (a study of skeletonization algorithms for 2D and 3D images following a digital topology or discrete topology approach)

    No full text
    This thesis proposes new thinning algorithms for 2D and 3D images according to two approaches, using either digital topology or discrete topology. In the first part, we recall some fundamental notions of digital topology and several of the best-known thinning algorithms, which delete simple points. Then, we propose a methodology to produce new thinning algorithms based on the parallel deletion of P-simple points. Such algorithms are designed so that they delete at least the points removed by another given existing thinning algorithm. By applying this methodology, we produce two new algorithms. Although the results seem satisfying, the design and implementation of such algorithms are not easy. In the second part, we use the concept of partially ordered sets (or posets). We propose, more directly than before, a thinning algorithm consisting in the repetition of the parallel deletion of alpha_n-simple points, followed by the parallel deletion of beta_n-simple points. We also propose new definitions of end points, which permit us to obtain either curve skeletons or surface skeletons. The thinning scheme is applied to 2D and 3D binary images and to 2D grayscale images. Finally, a study of the parallel filtering of skeletons is developed.